added ollama support for running local LLM's, Two new environment variables OLLAMA_URL and OLLAMA_MODEL #43
Conversation
Walkthrough
Adds Ollama LLM support and .env loading to the commit-msg CLI, and introduces two new environment variables, OLLAMA_URL and OLLAMA_MODEL.
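For context, a minimal sketch of how the .env loading and the new variables might be wired up. The variable names COMMIT_LLM, OLLAMA_URL, and OLLAMA_MODEL come from this PR; the surrounding code and the godotenv usage pattern are assumptions, not the repo's actual main.go.

```go
package main

import (
	"fmt"
	"os"

	"github.com/joho/godotenv"
)

func main() {
	// Load variables from a .env file if one exists; a missing file is not
	// fatal, the process environment is used as-is in that case.
	_ = godotenv.Load()

	commitLLM := os.Getenv("COMMIT_LLM")     // e.g. "ollama", "chatgpt", "claude"
	ollamaURL := os.Getenv("OLLAMA_URL")     // e.g. "http://localhost:11434/api/generate"
	ollamaModel := os.Getenv("OLLAMA_MODEL") // e.g. "llama3:latest"

	fmt.Println("provider:", commitLLM, "url:", ollamaURL, "model:", ollamaModel)
}
```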
Sequence Diagram(s)
sequenceDiagram
autonumber
actor U as User
participant CLI as commit-msg (CLI)
participant Env as .env loader
participant Git as Git Repo
participant Ollama as Ollama Server
participant OtherLLM as Other LLMs
participant CB as Clipboard
U->>CLI: Run commit-msg
CLI->>Env: Load .env (godotenv)
CLI->>Git: Validate repo, gather stats & diff
alt No changes
CLI-->>U: Exit (no changes)
else Changes detected
CLI->>CLI: Read env/config (COMMIT_LLM, OLLAMA_URL, OLLAMA_MODEL, API keys)
alt COMMIT_LLM == "ollama"
CLI->>Ollama: POST {model, prompt} to OLLAMA_URL
Ollama-->>CLI: Generated message / error
else other LLMs
CLI->>OtherLLM: GenerateCommitMessage(changes, apiKey)
OtherLLM-->>CLI: Generated message / error
end
alt Success
CLI->>CB: Copy commit message
CB-->>CLI: Copy result
CLI-->>U: Show stats + message + copy status
else Error
CLI-->>U: Show generation error
end
end
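The last step of the diagram copies the generated message to the clipboard. The PR does not show which clipboard library the project uses, so purely as an illustration, here is a hedged sketch using the third-party github.com/atotto/clipboard package:

```go
package main

import (
	"fmt"
	"log"

	"github.com/atotto/clipboard" // assumption: not necessarily the library this repo uses
)

func main() {
	msg := "feat: add Ollama support for generating commit messages locally"

	// Copy the generated commit message so it can be pasted into `git commit -m`.
	if err := clipboard.WriteAll(msg); err != nil {
		log.Printf("could not copy to clipboard: %v", err)
		fmt.Println(msg) // fall back to printing the message
		return
	}
	fmt.Println("Commit message copied to clipboard")
}
```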
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches
❌ Failed checks (2 warnings)
✅ Passed checks (3 passed)
Actionable comments posted: 1
🧹 Nitpick comments (1)
internal/ollama/ollama.go (1)
46-61: Add an HTTP client with timeout for robustness.
http.Post uses the default client with no timeout, so the CLI can hang forever if the local Ollama endpoint is down or unreachable. Please use a client (or context) with a sensible timeout before issuing the request.
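As an illustration of the suggestion, a hedged sketch of swapping http.Post for a client with a timeout; the endpoint and payload shape follow the review discussion, not verified project code.

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
	"time"
)

func main() {
	// A dedicated client with a timeout fails fast when the local Ollama
	// server is down, instead of hanging like the default http.Post client.
	client := &http.Client{Timeout: 60 * time.Second}

	payload := []byte(`{"model":"llama3:latest","prompt":"hello","stream":false}`)
	resp, err := client.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(payload))
	if err != nil {
		fmt.Println("ollama request failed:", err) // timeouts surface here as "context deadline exceeded"
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```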
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
go.sum is excluded by !**/*.sum
📒 Files selected for processing (3)
- cmd/commit-msg/main.go (2 hunks)
- go.mod (1 hunks)
- internal/ollama/ollama.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
cmd/commit-msg/main.go (9)
- internal/git/operations.go (2): IsRepository (16-23), GetChanges (26-130)
- pkg/types/types.go (2): Config (4-7), RepoConfig (10-13)
- internal/stats/statistics.go (1): GetFileStatistics (14-74)
- internal/display/display.go (3): ShowFileStatistics (20-96), ShowCommitMessage (99-115), ShowChangesPreview (118-133)
- internal/ollama/ollama.go (1): GenerateCommitMessage (24-75)
- internal/chatgpt/chatgpt.go (1): GenerateCommitMessage (13-32)
- internal/claude/claude.go (1): GenerateCommitMessage (33-85)
- internal/grok/grok.go (1): GenerateCommitMessage (15-87)
- internal/gemini/gemini.go (1): GenerateCommitMessage (13-44)
internal/ollama/ollama.go (2)
- pkg/types/types.go (1): Config (4-7)
- pkg/types/prompt.go (1): CommitPrompt (3-19)
DFanso left a comment
hey @vinyas-bharadwaj can u resolve the conflicts
Actionable comments posted: 0
🧹 Nitpick comments (1)
cmd/commit-msg/main.go (1)
124-143: Unify LLM dispatch on the already validated commitLLM. You already captured commitLLM (and validated the relevant secrets) above, yet the generation flow re-reads COMMIT_LLM for each branch. This duplication makes the control flow harder to follow and risks future divergence if the environment is tweaked mid-run. Please reuse the stored commitLLM (e.g., a switch mirroring the earlier one) and move the Ollama URL/model defaults alongside that branch so all provider-specific setup lives in one place.
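To illustrate the suggestion, a hedged sketch of a single switch on the stored commitLLM value with the Ollama defaults kept in its branch. The provider calls are local stand-ins; the repo's actual package APIs, signatures, and API-key variable names may differ.

```go
package main

import (
	"fmt"
	"os"
)

// Stand-ins for the real provider packages (internal/ollama, internal/chatgpt, ...).
// Signatures here are assumptions made for the sketch only.
func ollamaGenerate(url, model, changes string) (string, error) { return "chore: example", nil }
func chatgptGenerate(apiKey, changes string) (string, error)    { return "chore: example", nil }

// generate dispatches once on the already validated commitLLM value instead of
// re-reading COMMIT_LLM in every branch.
func generate(commitLLM, changes string) (string, error) {
	switch commitLLM {
	case "ollama":
		// Provider-specific defaults live next to the provider branch.
		url := os.Getenv("OLLAMA_URL")
		if url == "" {
			url = "http://localhost:11434/api/generate"
		}
		model := os.Getenv("OLLAMA_MODEL")
		if model == "" {
			model = "llama3:latest"
		}
		return ollamaGenerate(url, model, changes)
	case "chatgpt":
		return chatgptGenerate(os.Getenv("OPENAI_API_KEY"), changes) // env var name is illustrative
	default:
		return "", fmt.Errorf("unsupported COMMIT_LLM %q", commitLLM)
	}
}

func main() {
	msg, err := generate(os.Getenv("COMMIT_LLM"), "diff --git a/main.go b/main.go ...")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(msg)
}
```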
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
cmd/commit-msg/main.go (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
cmd/commit-msg/main.go (2)
- internal/git/operations.go (2): IsRepository (16-23), GetChanges (26-130)
- internal/ollama/ollama.go (1): GenerateCommitMessage (24-75)
I've resolved the issue with godotenv being imported but not used
DFanso left a comment
fork is outdated can u get the latest updates
if url == "" {
	url = "http://localhost:11434/api/generate"
}
@vinyas-bharadwaj can u get a latest update from upstream/main i think your fork is outdated we moved from if else to switch 🙂
alright my bad, I'll get it fixed 😭
thank you 🫡
87732a8 to 5f9a012 (Compare)
5f9a012 to 2f92ad6 (Compare)
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
go.sum is excluded by !**/*.sum
📒 Files selected for processing (3)
- cmd/commit-msg/main.go (3 hunks)
- go.mod (1 hunks)
- internal/ollama/ollama.go (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- go.mod
🧰 Additional context used
🧬 Code graph analysis (2)
internal/ollama/ollama.go (2)
- pkg/types/types.go (1): Config (4-7)
- pkg/types/prompt.go (1): CommitPrompt (3-19)
cmd/commit-msg/main.go (1)
- internal/ollama/ollama.go (1): GenerateCommitMessage (24-75)
Actionable comments posted: 1
🧹 Nitpick comments (1)
internal/ollama/ollama.go (1)
14-17: Remove unused struct to reduce dead code.
OllamaRequest is defined but never used; lines 34-38 build the request body with map[string]interface{} instead. This struct adds no value and may confuse future maintainers. Apply this diff to remove the unused struct:
-type OllamaRequest struct {
-	Model  string `json:"model"`
-	Prompt string `json:"prompt"`
-}
-
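Alternatively, if a typed request body is ever preferred over map[string]interface{}, the struct could be kept and actually used for marshaling. This is a hedged sketch only; the field set (including stream) is inferred from the review notes, not from the repo.

```go
package ollama

import "encoding/json"

// OllamaRequest mirrors the JSON body sent to Ollama's /api/generate endpoint.
// Field set assumed from the review discussion.
type OllamaRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

// buildRequestBody marshals a typed request instead of an ad-hoc map.
func buildRequestBody(model, prompt string) ([]byte, error) {
	return json.Marshal(OllamaRequest{Model: model, Prompt: prompt, Stream: false})
}
```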
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
internal/ollama/ollama.go (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
internal/ollama/ollama.go (2)
- pkg/types/types.go (1): Config (4-7)
- pkg/types/prompt.go (1): CommitPrompt (3-19)
🔇 Additional comments (1)
internal/ollama/ollama.go (1)
24-75: LGTM! Well-structured Ollama integration. The implementation correctly (see the sketch after this list):
- Addresses the previous review by marking config as intentionally unused
- Defaults to llama3:latest when no model is specified
- Sets stream: false for non-streaming responses
- Reads the full response body before checking status (better error reporting)
- Validates non-empty responses
- Handles errors at each step
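To make the list above concrete, a hedged sketch of what such a flow can look like. This is not the repo's actual internal/ollama code; only the behaviours listed above (defaults, stream: false, full body read, non-empty check) are taken from the review, and the "response" field name follows Ollama's documented /api/generate output.

```go
package ollama

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
	"time"
)

// GenerateCommitMessage posts the prompt to a local Ollama server and returns
// the generated commit message. Sketch only; the repo's signature may differ.
func GenerateCommitMessage(changes string) (string, error) {
	url := os.Getenv("OLLAMA_URL")
	if url == "" {
		url = "http://localhost:11434/api/generate"
	}
	model := os.Getenv("OLLAMA_MODEL")
	if model == "" {
		model = "llama3:latest" // default model when none is configured
	}

	payload, err := json.Marshal(map[string]interface{}{
		"model":  model,
		"prompt": "Write a concise commit message for these changes:\n" + changes,
		"stream": false, // single JSON response instead of a stream of chunks
	})
	if err != nil {
		return "", err
	}

	client := &http.Client{Timeout: 60 * time.Second}
	resp, err := client.Post(url, "application/json", bytes.NewReader(payload))
	if err != nil {
		return "", fmt.Errorf("calling Ollama: %w", err)
	}
	defer resp.Body.Close()

	// Read the full body first so error responses can be reported verbatim.
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	if resp.StatusCode != http.StatusOK {
		return "", fmt.Errorf("ollama returned %s: %s", resp.Status, body)
	}

	var out struct {
		Response string `json:"response"`
	}
	if err := json.Unmarshal(body, &out); err != nil {
		return "", fmt.Errorf("decoding Ollama response: %w", err)
	}
	if out.Response == "" {
		return "", fmt.Errorf("ollama returned an empty commit message")
	}
	return out.Response, nil
}
```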
DFanso left a comment
Approved 🎊
added ollama support for running local LLM's, Two new environment variables OLLAMA_URL and OLLAMA_MODEL
Description
Type of Change
Related Issue
Fixes #30
Changes Made
Testing
Checklist
Screenshots (if applicable)
Additional Notes
For Hacktoberfest Participants
Thank you for your contribution! 🎉
Summary by CodeRabbit
New Features
UX / Bug Fixes
Chores